Hypercube estimators: Penalized least squares, submodel selection, and numerical stability

Author

  • Rudolf Beran
Abstract

Hypercube estimators for the mean vector in a general linear model include algebraic equivalents to penalized least squares estimators with quadratic penalties and to submodel least squares estimators. Penalized least squares estimators necessarily break down numerically for certain penalty matrices. Equivalent hypercube estimators resist this source of numerical instability. Under conditions, adaptation over a class of candidate hypercube estimators, so as to minimize the estimated quadratic risk, also minimizes the asymptotic risk under the general linear model. Numerical stability of hypercube estimators assists trustworthy adaptation. Hypercube estimators have broad applicability to any statistical methodology that involves penalized least squares. Notably, they extend to general designs the risk reduction achieved by Stein’s multiple shrinkage estimators for balanced observations on an array of means. © 2013 Elsevier B.V. All rights reserved.
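The penalized least squares setup the abstract refers to can be illustrated with a minimal sketch (this is only the standard quadratic-penalty criterion, not the paper's hypercube construction): with design matrix X, response y, and a symmetric positive semidefinite penalty matrix P, the estimator minimizes ||y − Xβ||² + β′Pβ and has the closed form β̂ = (X′X + P)⁻¹X′y. The function and variable names below are illustrative only.

```python
import numpy as np

def penalized_ls(X, y, P):
    """Quadratic-penalty least squares: solve (X'X + P) beta = X'y.

    P is a symmetric positive semidefinite penalty matrix; P = lam * I
    recovers ridge regression, P = 0 recovers ordinary least squares.
    """
    XtX = X.T @ X
    return np.linalg.solve(XtX + P, X.T @ y)

# Small synthetic illustration.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(20)

# Ridge-type fit with penalty matrix P = 1.0 * I.
beta_hat = penalized_ls(X, y, 1.0 * np.eye(3))
```

As the abstract notes, direct solution of the penalized normal equations can be numerically unstable for certain (e.g. nearly singular or badly scaled) penalty matrices; the paper's point is that an algebraically equivalent hypercube form avoids that instability.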


Similar articles

Regularization of Wavelet Approximations

In this paper, we introduce nonlinear regularized wavelet estimators for estimating nonparametric regression functions when sampling points are not uniformly spaced. The approach can apply readily to many other statistical contexts. Various new penalty functions are proposed. The hard-thresholding and soft-thresholding estimators of Donoho and Johnstone are specific members of nonlinear regular...


Rank-based variable selection

This note considers variable selection in the robust linear model via R-estimates. The proposed rank-based approach is a generalization of the penalized least squares estimators where we replace the least squares loss function with Jaeckel’s (1972) dispersion function. Our rank-based method is robust to outliers in the errors and has roots in traditional nonparametric statistics for simple loca...


Model and Variable Selection Procedures for Semiparametric Time Series Regression

Semiparametric regression models are very useful for time series analysis. They facilitate the detection of features resulting from external interventions. The complexity of semiparametric models poses new challenges for issues of nonparametric and parametric inference and model selection that frequently arise from time series data analysis. In this paper, we propose penalized least squares est...


Asymptotic oracle properties of SCAD-penalized least squares estimators

We study the asymptotic properties of the SCAD-penalized least squares estimator in sparse, high-dimensional, linear regression models when the number of covariates may increase with the sample size. We are particularly interested in the use of this estimator for simultaneous variable selection and estimation. We show that under appropriate conditions, the SCAD-penalized least squares estimator...


Hardness on Numerical Realization of Some Penalized Likelihood Estimators

Abstract: We show that with a class of penalty functions, numerical problems associated with the implementation of the penalized least square estimators are equivalent to the exact cover by 3-sets problem, which belongs to a class of NP-hard problems. We then extend this NP-hardness result to the cases of penalized least absolute deviation regression and penalized support vector machines. We di...



Journal:
  • Computational Statistics & Data Analysis

Volume 71, Issue -

Pages -

Published 2014